Dynamic Thick Restarting of the Davidson, and the Implicitly Restarted Arnoldi Methods

Authors

  • Andreas Stathopoulos
  • Yousef Saad
  • Kesheng Wu
Abstract

The Davidson method is a popular preconditioned variant of the Arnoldi method for solving large eigenvalue problems. For theoretical as well as practical reasons, the two methods are often used with restarting. Frequently, information is saved in the form of approximate eigenvectors to compensate for the convergence impairment caused by restarting. We call this scheme of retaining more eigenvectors than needed `thick restarting', and prove that thick-restarted, non-preconditioned Davidson is equivalent to the implicitly restarted Arnoldi method. We also establish a relation between thick-restarted Davidson and a Davidson method applied to a deflated system. The theory is used to address the question of which and how many eigenvectors to retain, and it motivates the development of a dynamic thick restarting scheme for the symmetric case, which can be used in both Davidson and implicitly restarted Arnoldi. Several experiments demonstrate the efficiency and robustness of the scheme.
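
To make the retention scheme concrete, the following is a minimal numerical sketch of a thick-restarted, unpreconditioned Davidson iteration for a symmetric matrix. It only illustrates the idea described in the abstract and is not the authors' implementation; the function name and parameters (A, nev, keep, m_max) are hypothetical, and a true Davidson method would apply a preconditioner to the residual before expanding the basis.

import numpy as np

def thick_restarted_davidson(A, nev=4, keep=10, m_max=30, tol=1e-8, maxiter=500):
    # Illustrative sketch: unpreconditioned Davidson with thick restarting.
    n = A.shape[0]
    rng = np.random.default_rng(0)
    V, _ = np.linalg.qr(rng.standard_normal((n, 1)))   # orthonormal search basis
    for _ in range(maxiter):
        H = V.T @ (A @ V)                               # Rayleigh-Ritz projection
        theta, S = np.linalg.eigh(H)                    # Ritz values (ascending) and vectors
        X = V @ S                                       # Ritz vectors in the original space
        k = min(nev, V.shape[1])
        R = A @ X[:, :k] - X[:, :k] * theta[:k]         # residuals of the wanted pairs
        resnorms = np.linalg.norm(R, axis=0)
        if V.shape[1] >= nev and resnorms.max() < tol:
            return theta[:nev], X[:, :nev]
        if V.shape[1] >= m_max:
            V = X[:, :keep]                             # thick restart: keep > nev Ritz vectors
        j = int(np.argmax(resnorms >= tol))             # first unconverged wanted pair
        r = R[:, j]                                     # true Davidson would precondition r here
        r = r - V @ (V.T @ r)                           # orthogonalize against the basis
        r = r - V @ (V.T @ r)                           # repeat once for numerical stability
        nrm = np.linalg.norm(r)
        if nrm < 1e-12:                                 # span(V) is invariant; add a random direction
            r = rng.standard_normal(n)
            r = r - V @ (V.T @ r)
            nrm = np.linalg.norm(r)
        V = np.hstack([V, (r / nrm)[:, None]])
    return theta[:nev], X[:, :nev]

Because the expansion vector is the unpreconditioned residual, this sketch corresponds to the non-preconditioned case that the abstract relates to implicitly restarted Arnoldi; the choice of keep is exactly the quantity the paper's dynamic scheme adjusts.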


Related works

Thick-Restart Lanczos Method for Symmetric Eigenvalue Problems

For real symmetric eigenvalue problems, there are a number of algorithms that are mathematically equivalent, for example, the Lanczos algorithm, the Arnoldi method and the unpreconditioned Davidson method. The Lanczos algorithm is often preferred because it uses significantly fewer arithmetic operations per iteration. To limit the maximum memory usage, these algorithms are often restarted. In re...
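
As a point of reference for the cost remark above, the sketch below shows the plain (un-restarted) Lanczos three-term recurrence for a symmetric matrix: each step orthogonalizes only against the previous two basis vectors, which is why its per-iteration arithmetic is cheaper than the full orthogonalization of Arnoldi or Davidson. The function name and parameters are illustrative, and no reorthogonalization is performed, so this is a sketch rather than a robust implementation.

import numpy as np

def lanczos(A, m):
    # Plain Lanczos recurrence (no restarting, no reorthogonalization).
    n = A.shape[0]
    rng = np.random.default_rng(0)
    V = np.zeros((n, m + 1))
    alpha, beta = np.zeros(m), np.zeros(m)
    v0 = rng.standard_normal(n)
    V[:, 0] = v0 / np.linalg.norm(v0)
    k = m
    for j in range(m):
        w = A @ V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]   # only the previous two basis vectors
        alpha[j] = V[:, j] @ w               # are touched, hence the low cost per step
        w -= alpha[j] * V[:, j]
        beta[j] = np.linalg.norm(w)
        if beta[j] < 1e-14:                  # invariant subspace found early
            k = j + 1
            break
        V[:, j + 1] = w / beta[j]
    T = np.diag(alpha[:k]) + np.diag(beta[:k - 1], 1) + np.diag(beta[:k - 1], -1)
    return np.linalg.eigvalsh(T), V[:, :k]   # Ritz values from the small tridiagonal T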


Restarting Techniques for the (Jacobi-)Davidson Symmetric Eigenvalue Methods

The (Jacobi-)Davidson method, which is a popular preconditioned extension to the Arnoldi method for solving large eigenvalue problems, is often used with restarting. This has significant performance shortcomings, since important components of the invariant subspace may be discarded. One way of saving more information at restart is through “thick” restarting, a technique that involves keeping mo...


Some new restart vectors for explicitly restarted Arnoldi method

The explicitly restarted Arnoldi method (ERAM) can be used to find some eigenvalues of large and sparse matrices. However, it has been shown that even this method may fail to converge. In this paper, we present two new methods to accelerate the convergence of the ERAM algorithm. In these methods, we apply two strategies for updating the initial vector in each restart cycle. The implementation of th...


Filtering and Restarting Orthogonal Projection Methods

We consider the class of Orthogonal Projection Methods (OPM) for iteratively solving large and generalised eigenvalue problems. An OPM is a method that projects a large eigenvalue problem onto a smaller subspace. In this subspace, an approximation of the eigenvalue spectrum can be computed from a small eigenvalue problem using a direct method. We show that many iterative eigenvalue solvers, such...
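
The projection step this snippet describes can be written in a few lines. The sketch below (with made-up names n, m, A, V) projects a symmetric test matrix onto a random orthonormal basis and solves the small projected problem with a dense direct solver; in an actual OPM the subspace would be built iteratively (Krylov, Davidson, ...), so the residual would shrink across iterations.

import numpy as np

rng = np.random.default_rng(0)
n, m = 500, 20
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                                   # symmetric test matrix
V, _ = np.linalg.qr(rng.standard_normal((n, m)))    # orthonormal basis of a small subspace
H = V.T @ A @ V                                     # projected m-by-m eigenvalue problem
theta, S = np.linalg.eigh(H)                        # solved directly with a dense routine
ritz_vectors = V @ S                                # approximate eigenvectors in the large space
res = np.linalg.norm(A @ ritz_vectors[:, 0] - theta[0] * ritz_vectors[:, 0])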


Nested Lanczos: Implicitly Restarting a Lanczos Algorithm

In this text, we present a generalisation of the idea of the Implicitly Restarted Arnoldi method to the nonsymmetric Lanczos algorithm, using the two-sided Gram-Schmidt process or using a full Lanczos tridiagonalisation. The Implicitly Restarted Lanczos method can be combined with an implicit filter. It can also be used in case of breakdown and offers an alternative to look-ahead.



Journal:
  • SIAM J. Scientific Computing

Volume 19, Issue 

Pages  -

Publication year: 1998